
    Prospects for gamma measurements at LHCb

    LHCb is the dedicated B physics experiment at the LHC and is due to start data taking later this year. Its goal is to search for new physics in very rare processes and to make precision measurements of CP violation in B decays. The CKM angle gamma plays an important role in flavour physics in the Standard Model. LHCb will exploit the large variety of B hadrons produced in 14 TeV pp collisions, performing gamma measurements to a precision of a few degrees. Here we present a summary of the expected gamma sensitivities that LHCb will reach during its first years of data taking, with contributions from several strategies in both tree and loop processes. Comment: 4 pages, 3 figures. For proceedings of 2009 Lake Louise Winter Institute, Alberta, Canada

    Optimizing Higgs factories by modifying the recoil mass

    It is difficult to measure the $WW$-fusion Higgs production process ($e^+e^- \to \nu\bar{\nu}h$) at a lepton collider with a center-of-mass energy of 240-250 GeV due to its small rate and the large background from the Higgsstrahlung process with an invisible $Z$ ($e^+e^- \to hZ,\, Z \to \nu\bar{\nu}$). We construct a modified recoil mass variable, $m^p_{\rm recoil}$, defined using only the 3-momentum of the reconstructed Higgs particle, and show that it separates the $WW$-fusion and Higgsstrahlung events better than the original recoil mass variable $m_{\rm recoil}$. Consequently, the $m^p_{\rm recoil}$ variable can be used to improve the overall precision of the extracted Higgs couplings, in both the conventional framework and the effective-field-theory framework. We also explore the application of the $m^p_{\rm recoil}$ variable in the inclusive cross section measurements of the Higgsstrahlung process; a quantitative analysis is left for future studies. Comment: 25 pages, 8 figures
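
    The abstract leaves the exact definition implicit; a minimal sketch, assuming $m^p_{\rm recoil}$ is obtained from the usual recoil mass by replacing the measured Higgs energy with the energy implied by the measured 3-momentum and the nominal Higgs mass (the collider energy and helper names below are illustrative assumptions, not taken from the paper):

        import numpy as np

        SQRT_S = 240.0  # assumed e+e- center-of-mass energy in GeV
        M_H = 125.0     # nominal Higgs mass in GeV

        def m_recoil(e_h, p_h):
            # Original recoil mass, from the measured Higgs candidate
            # energy e_h and 3-momentum magnitude p_h (all in GeV).
            m2 = (SQRT_S - e_h) ** 2 - p_h ** 2
            return np.sqrt(np.maximum(m2, 0.0))

        def m_recoil_p(p_h, m_h=M_H):
            # Modified recoil mass: the measured energy is replaced by
            # the energy implied by the measured 3-momentum and the
            # nominal Higgs mass, so only the (typically better-
            # measured) momentum enters.
            e_h = np.sqrt(p_h ** 2 + m_h ** 2)
            m2 = (SQRT_S - e_h) ** 2 - p_h ** 2
            return np.sqrt(np.maximum(m2, 0.0))

        # For a true e+e- -> hZ event at 240 GeV the Higgs momentum is
        # about 51.5 GeV, and m_recoil_p lands near the Z mass (~91 GeV):
        print(m_recoil_p(51.5))

    Under this reading, Higgsstrahlung events peak sharply at the $Z$ mass in $m^p_{\rm recoil}$ while $WW$-fusion events populate a broader region, which is what drives the improved separation.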

    Generalized Stability of Heisenberg Coefficients

    Stembridge introduced the notion of stability for Kronecker triples, which generalizes Murnaghan's classical stability result for Kronecker coefficients. Sam and Snowden proved a conjecture of Stembridge concerning stable Kronecker triples, and they also showed an analogous result for Littlewood--Richardson coefficients. Heisenberg coefficients are Schur structure constants of the Heisenberg product which generalize both Littlewood--Richardson coefficients and Kronecker coefficients. We show that any stable triple for Kronecker coefficients or Littlewood--Richardson coefficients also stabilizes Heisenberg coefficients, and we classify the triples stabilizing Heisenberg coefficients. We also follow Vallejo's idea of using matrix additivity to generate Heisenberg stable triples. Comment: 13 pages
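
    For orientation: in the Kronecker case, Stembridge's definition (restated here for the reader; the abstract itself does not spell it out) calls a triple $(\alpha,\beta,\gamma)$ of partitions with $g(\alpha,\beta,\gamma) > 0$ stable if, for every triple $(\lambda,\mu,\nu)$, the sequence

        \[
          n \;\longmapsto\; g(\lambda + n\alpha,\ \mu + n\beta,\ \nu + n\gamma)
        \]

    is eventually constant. Murnaghan's classical result is the special case $(\alpha,\beta,\gamma) = ((1),(1),(1))$, where the first rows of all three partitions grow; the Heisenberg notion studied in the paper is the analogue for Heisenberg coefficients.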

    A two step model for linear prediction, with connections to PLS

    In this thesis, we consider prediction of a univariate response variable, especially when the explanatory variables are almost collinear. A two-step approach is proposed. The first step summarizes the information in the explanatory variables via a bilinear model with a Krylov structured design matrix. The second step is the prediction step, where a conditional predictor is applied. The two-step approach gives new insight into partial least squares regression (PLS). Explicit maximum likelihood estimators of the variances and mean for the explanatory variables are derived. It is shown that the mean square error of the predictor in the two-step model is always smaller than that in PLS. Moreover, the two-step model is extended to handle grouped data. A real data set is analyzed to illustrate the performance of the two-step approach and to compare it with other regularized methods
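
    The thesis's own two-step estimators are not reproduced here, but the Krylov connection to PLS can be made concrete. A minimal sketch of Helland's Krylov formulation of k-component PLS regression (the data and helper names are invented for illustration):

        import numpy as np

        def pls_krylov_beta(X, y, k):
            # Center predictors and response.
            x_mean, y_mean = X.mean(axis=0), y.mean()
            Xc, yc = X - x_mean, y - y_mean
            s = Xc.T @ yc          # s = X'y
            G = Xc.T @ Xc          # G = X'X
            # Krylov basis K = [s, Gs, ..., G^(k-1) s]; the k-component
            # PLS coefficient vector lies in its column space.
            cols = [s]
            for _ in range(k - 1):
                cols.append(G @ cols[-1])
            K = np.column_stack(cols)
            # Least squares restricted to the Krylov space; lstsq guards
            # against the near-singularity caused by collinearity.
            coef, *_ = np.linalg.lstsq(K.T @ G @ K, K.T @ s, rcond=None)
            return K @ coef, x_mean, y_mean

        # Almost collinear explanatory variables, as in the thesis setting.
        rng = np.random.default_rng(1)
        z = rng.normal(size=(100, 1))
        X = z @ np.ones((1, 5)) + 0.01 * rng.normal(size=(100, 5))
        y = X @ np.array([1.0, 2.0, 0.0, 0.0, 1.0]) + 0.1 * rng.normal(size=100)
        beta, x_mean, y_mean = pls_krylov_beta(X, y, k=2)
        y_hat = y_mean + (X - x_mean) @ beta   # in-sample predictions

    The Krylov structured design matrix in the first step of the two-step model plays the role of the basis K above.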

    Noise threshold and resource cost of fault-tolerant quantum computing with Majorana fermions in hybrid systems

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g. superconducting circuits or quantum dots, is studied in this paper. Errors caused by the topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to that of a normal quantum computer unless the topological charge of the Majorana fermions is insusceptible to noise. If errors that change the topological charge are rare, the fault-tolerance threshold is much higher than that of a normal quantum computer, and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about a thousand normal qubits. Comment: 15 pages, 11 figures
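
    For a rough sense of how a higher threshold shrinks the qubit count, here is a back-of-the-envelope sketch using the standard surface-code heuristics $p_L \sim A\,(p/p_{\rm th})^{(d+1)/2}$ and a footprint of roughly $2d^2$ physical qubits per logical qubit; all constants below are illustrative textbook approximations, not values from this paper:

        def code_distance(p_phys, p_target, p_th, prefactor=0.1):
            # Smallest odd distance d whose estimated logical error rate
            # p_L ~ prefactor * (p_phys / p_th)**((d + 1) / 2) is below
            # the target.
            d = 3
            while prefactor * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
                d += 2
            return d

        def qubits_per_logical(d):
            # A surface-code patch uses roughly 2*d*d physical qubits
            # (d*d data qubits plus about as many ancillas).
            return 2 * d * d

        # Ordinary qubits: error rate only modestly below a ~1% threshold.
        d_norm = code_distance(p_phys=1e-3, p_target=1e-12, p_th=1e-2)
        # Hybrid device whose topological charge is rarely disturbed,
        # modeled here as an effectively higher threshold (illustrative).
        d_topo = code_distance(p_phys=1e-3, p_target=1e-12, p_th=1e-1)
        print(d_norm, qubits_per_logical(d_norm))  # larger d, ~10^3 qubits
        print(d_topo, qubits_per_logical(d_topo))  # smaller d, far fewer

    The same target logical error rate is reached with a much smaller code distance when the effective threshold is higher, which is the mechanism behind the tens-versus-thousand comparison quoted in the abstract.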